Results 1 - 20 of 95
1.
Nat Commun ; 15(1): 3886, 2024 May 08.
Article in English | MEDLINE | ID: mdl-38719856

ABSTRACT

Machine learning provides a data-driven approach for creating a digital twin of a system - a digital model used to predict the system behavior. Having an accurate digital twin can drive many applications, such as controlling autonomous systems. Often, the size, weight, and power consumption of the digital twin or related controller must be minimized, ideally realized on embedded computing hardware that can operate without a cloud-computing connection. Here, we show that a nonlinear controller based on next-generation reservoir computing can tackle a difficult control problem: controlling a chaotic system to an arbitrary time-dependent state. The model is accurate, yet it is small enough to be evaluated on a field-programmable gate array typically found in embedded devices. Furthermore, the model only requires 25.0 ± 7.0 nJ per evaluation, well below other algorithms, even without systematic power optimization. Our work represents the first step in deploying efficient machine learning algorithms to the computing "edge."

2.
Chaos ; 34(2)2024 Feb 01.
Article in English | MEDLINE | ID: mdl-38305050

ABSTRACT

In this work, we combine nonlinear system control techniques with next-generation reservoir computing, a best-in-class machine learning approach for predicting the behavior of dynamical systems. We demonstrate the performance of the controller on a series of control tasks for the chaotic Hénon map, including controlling the system between unstable fixed points, stabilizing it onto higher-order periodic orbits, and driving it to an arbitrary desired state. We show that our controller succeeds in these tasks, requires only ten data points for training, can control the system to a desired trajectory in a single iteration, and is robust to noise and modeling error.
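
The idea behind this kind of model-based control can be conveyed with a toy sketch (not the authors' implementation: the feature library, the additive actuation on the x-equation, and all parameter values here are illustrative assumptions). Because the Hénon map is polynomial, a linear readout on quadratic features learns an exact one-step model from a handful of points, and the control input that reaches a target in a single iteration follows directly:

```python
import numpy as np

# Hénon map with an additive control input u on the x-equation
# (an illustrative actuation choice, not necessarily the paper's).
a, b = 1.4, 0.3
def henon(x, y, u=0.0):
    return 1.0 - a * x**2 + y + u, b * x

# Collect a short uncontrolled trajectory (~10 points, as in the abstract).
x, y = 0.1, 0.1
states = []
for _ in range(11):
    states.append((x, y))
    x, y = henon(x, y)
states = np.array(states)

# Quadratic feature library: the Hénon map is polynomial, so linear
# least squares on these features recovers the dynamics exactly.
def features(x, y):
    return np.array([1.0, x, y, x * x, x * y, y * y])

Phi = np.array([features(px, py) for px, py in states[:-1]])
W, *_ = np.linalg.lstsq(Phi, states[1:], rcond=None)

# One-step control: choose u so that the predicted next x hits the target.
def control(x, y, x_target):
    x_pred = features(x, y) @ W[:, 0]  # model's free-running prediction of x
    return x_target - x_pred           # u enters the x-equation additively

x, y, x_star = 0.3, -0.1, 0.5
u = control(x, y, x_star)
x_next, _ = henon(x, y, u)
print(abs(x_next - x_star))  # tiny: the learned model is exact here
```

The single linear solve and the tiny training set mirror the abstract's claims, although the paper's controller also handles noise and modeling error, which this sketch does not.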

3.
Chaos ; 33(7)2023 Jul 01.
Article in English | MEDLINE | ID: mdl-37486668

ABSTRACT

Adaptivity is a dynamical feature that is omnipresent in nature, socio-economics, and technology. For example, adaptive couplings appear in various real-world systems, such as the power grid, social, and neural networks, and they form the backbone of closed-loop control strategies and machine learning algorithms. In this article, we provide an interdisciplinary perspective on adaptive systems. We reflect on the notion and terminology of adaptivity in different disciplines and discuss which role adaptivity plays for various fields. We highlight common open challenges and give perspectives on future research directions, looking to inspire interdisciplinary approaches.

4.
Chaos ; 32(11): 113107, 2022 Nov.
Article in English | MEDLINE | ID: mdl-36456323

ABSTRACT

Reservoir computing is a machine learning approach that can generate a surrogate model of a dynamical system. It can learn the underlying dynamical system using fewer trainable parameters and, hence, smaller training data sets than competing approaches. Recently, a simpler formulation, known as next-generation reservoir computing, removed many algorithm metaparameters and identified a well-performing traditional reservoir computer, thus simplifying training even further. Here, we study a particularly challenging problem of learning a dynamical system that has both disparate time scales and multiple co-existing dynamical states (attractors). We compare the next-generation and traditional reservoir computer using metrics quantifying the geometry of the ground-truth and forecasted attractors. For the studied four-dimensional system, the next-generation reservoir computing approach uses ∼ 1.7 × less training data, requires 10 × shorter "warmup" time, has fewer metaparameters, and has an ∼ 100 × higher accuracy in predicting the co-existing attractor characteristics in comparison to a traditional reservoir computer. Furthermore, we demonstrate that it predicts the basin of attraction with high accuracy. This work lends further support to the superior learning ability of this new machine learning algorithm for dynamical systems.

5.
Chaos ; 32(9): 093137, 2022 Sep.
Article in English | MEDLINE | ID: mdl-36182396

ABSTRACT

Forecasting the behavior of high-dimensional dynamical systems using machine learning requires efficient methods to learn the underlying physical model. We demonstrate spatiotemporal chaos prediction using a machine learning architecture that, when combined with a next-generation reservoir computer, displays state-of-the-art performance, with a training time orders of magnitude shorter and a training data set ∼10 times smaller than other machine learning algorithms. We also take advantage of the translational symmetry of the model to further reduce the computational cost and training data, each by a factor of ∼10.

6.
Phys Rev E ; 104(4-2): 045307, 2021 Oct.
Article in English | MEDLINE | ID: mdl-34781436

ABSTRACT

We demonstrate that matching the symmetry properties of a reservoir computer (RC) to the data being processed dramatically increases its processing power. We apply our method to the parity task, a challenging benchmark problem that highlights inversion and permutation symmetries, and to a chaotic system inference task that presents an inversion symmetry rule. For the parity task, our symmetry-aware RC obtains zero error using an exponentially reduced neural network and training data, greatly speeding up the time to result and outperforming artificial neural networks. When both symmetries are respected, we find that the network size N necessary to obtain zero error for 50 different RC instances scales linearly with the parity order n. Moreover, some symmetry-aware RC instances achieve zero-error classification with only N=1 for n≤7. Furthermore, we show that a symmetry-aware RC only needs a training data set of size on the order of (n+n/2) to obtain such performance, an exponential reduction compared to a regular RC, which requires a training data set of size on the order of n·2^n to contain all 2^n possible n-bit-long sequences. For the inference task, we show that a symmetry-aware RC presents a normalized root-mean-square error three orders of magnitude smaller than regular RCs. For both tasks, our RC approach respects the symmetries by adjusting only the input and output layers, not by problem-based modifications to the neural network. We anticipate that generalizations of our procedure can be applied to information processing for problems with known symmetries.
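
Why respecting the permutation symmetry buys an exponential saving can be seen in a minimal sketch (a deliberately simplified stand-in for a symmetry-aware input layer, not the paper's reservoir construction): parity depends on the input bits only through their sum, so a symmetry-aware encoding collapses the 2^n-point domain to n+1 points, and a linear readout trained on only n+1 examples then classifies every possible input with zero error:

```python
import numpy as np
from itertools import product

n = 7  # parity order

# Permutation symmetry: parity depends on the bits only through their sum s.
# A symmetry-aware input layer one-hot encodes s in {0, ..., n}, shrinking
# the function domain from 2^n points to n+1.
def sym_features(bits):
    onehot = np.zeros(n + 1)
    onehot[sum(bits)] = 1.0
    return onehot

# Train a linear readout on just n+1 examples, one per bit-sum value
# (vs. the ~n·2^n samples quoted for a symmetry-unaware reservoir).
train = [tuple(1 if i < s else 0 for i in range(n)) for s in range(n + 1)]
Phi = np.array([sym_features(b) for b in train])
y = np.array([sum(b) % 2 for b in train])
w = np.linalg.lstsq(Phi, y, rcond=None)[0]

# Zero error on all 2^n possible n-bit inputs.
errors = sum(
    round(float(sym_features(b) @ w)) != sum(b) % 2
    for b in product([0, 1], repeat=n)
)
print(errors)  # → 0
```

The point of the sketch is the one in the abstract: the symmetry is imposed entirely in the input encoding and the linear output layer, with no problem-specific change to the learner itself.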

7.
Chaos ; 31(10): 103127, 2021 Oct.
Article in English | MEDLINE | ID: mdl-34717323

ABSTRACT

Reservoir computers are powerful tools for chaotic time series prediction. They can be trained to approximate phase space flows and can thus both predict future values to a high accuracy and reconstruct the general properties of a chaotic attractor without requiring a model. In this work, we show that the ability to learn the dynamics of a complex system can be extended to systems with multiple co-existing attractors, here a four-dimensional extension of the well-known Lorenz chaotic system. We demonstrate that a reservoir computer can infer entirely unexplored parts of the phase space; a properly trained reservoir computer can predict the existence of attractors that were never approached during training and, therefore, are labeled as unseen. We provide examples where attractor inference is achieved after training solely on a single noisy trajectory.

8.
Nat Commun ; 12(1): 5564, 2021 09 21.
Article in English | MEDLINE | ID: mdl-34548491

ABSTRACT

Reservoir computing is a best-in-class machine learning algorithm for processing information generated by dynamical systems using observed time-series data. Importantly, it requires very small training data sets, uses linear optimization, and thus requires minimal computing resources. However, the algorithm uses randomly sampled matrices to define the underlying recurrent neural network and has a multitude of metaparameters that must be optimized. Recent results demonstrate the equivalence of reservoir computing to nonlinear vector autoregression, which requires no random matrices, fewer metaparameters, and provides interpretable results. Here, we demonstrate that nonlinear vector autoregression excels at reservoir computing benchmark tasks and requires even shorter training data sets and training time, heralding the next generation of reservoir computing.
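
The nonlinear vector autoregression at the heart of next-generation reservoir computing can be sketched in a few lines (a minimal illustration under assumed choices: two delay taps, quadratic monomials, and a small ridge term; the paper's feature sets and benchmark tasks are richer). The feature vector concatenates a constant, time-delayed observations, and their products, and the readout is a single linear regression:

```python
import numpy as np

# Build NVAR features: constant + k delayed values + unique quadratic products.
def nvar_features(series, k):
    rows = []
    for t in range(k - 1, len(series)):
        lin = series[t - k + 1 : t + 1]                # k delayed observations
        quad = np.outer(lin, lin)[np.triu_indices(k)]  # unique pairwise products
        rows.append(np.concatenate(([1.0], lin, quad)))
    return np.array(rows)

# Data: the chaotic logistic map x_{t+1} = 4 x_t (1 - x_t).
x = np.empty(300)
x[0] = 0.21
for t in range(299):
    x[t + 1] = 4 * x[t] * (1 - x[t])

k = 2                                # number of delay taps (assumed)
Phi = nvar_features(x[:-1], k)       # features built from x_0 ... x_298
target = x[k:]                       # one-step-ahead targets
ridge = 1e-8                         # regularization; training is one linear solve
W = np.linalg.solve(Phi.T @ Phi + ridge * np.eye(Phi.shape[1]), Phi.T @ target)

# The map is quadratic, so it lies in the feature span and the fit is
# essentially exact even with this short training set.
pred = Phi @ W
print(np.max(np.abs(pred - target)))
```

No random matrices appear anywhere, and training is a single regularized least-squares solve, which is the source of the speed, small-data, and interpretability claims in the abstract.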

9.
Entropy (Basel) ; 23(8)2021 Jul 30.
Article in English | MEDLINE | ID: mdl-34441128

ABSTRACT

Quantum key distribution (QKD) systems provide a method for two users to exchange a provably secure key. Synchronizing the users' clocks is an essential step before a secure key can be distilled. Qubit-based synchronization protocols directly use the transmitted quantum states to achieve synchronization and thus avoid the need for additional classical synchronization hardware. Previous qubit-based synchronization protocols sacrifice secure key either directly or indirectly, and all known qubit-based synchronization protocols do not efficiently use all publicly available information published by the users. Here, we introduce a Bayesian probabilistic algorithm that incorporates all published information to efficiently find the clock offset without sacrificing any secure key. Additionally, the output of the algorithm is a probability, which allows us to quantify our confidence in the synchronization. For demonstration purposes, we present a model system with accompanying simulations of an efficient three-state BB84 prepare-and-measure protocol with decoy states. We use our algorithm to exploit the correlations between Alice's published basis and mean photon number choices and Bob's measurement outcomes to probabilistically determine the most likely clock offset. We find that we can achieve a 95 percent synchronization confidence in only 4140 communication bin widths, meaning we can tolerate clock drift approaching 1 part in 4140 in this example when simulating this system with a dark count probability per communication bin width of 8×10⁻⁴ and a received mean photon number of 0.01.
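
The flavor of such an algorithm can be conveyed with a toy Bayesian offset scan (a simplified stand-in, not the paper's three-state BB84 likelihood: Alice's published choices are reduced to "bright vs. vacuum" bins, only the click terms of the likelihood are kept, and the per-bin click probability is exaggerated to 0.05 so the demo stays compact):

```python
import numpy as np

rng = np.random.default_rng(7)
n_bins = 4000
offset_true = 137                     # unknown clock offset, in bin widths

# Alice's published per-bin choices, reduced to bright vs. vacuum bins.
bright = rng.random(n_bins) < 0.5
p_bright, p_dark = 0.05, 8e-4         # per-bin click probabilities (toy values)

# Bob's clicks, generated with the true offset applied (cyclic for simplicity).
src = (np.arange(n_bins) - offset_true) % n_bins
clicks = rng.random(n_bins) < np.where(bright[src], p_bright, p_dark)

# Bayesian scan: log-likelihood of each candidate offset given the click
# record, assuming a uniform prior and keeping only the click terms.
click_bins = np.nonzero(clicks)[0]
log_pb, log_pd = np.log(p_bright), np.log(p_dark)
logL = np.array([
    np.where(bright[(click_bins - d) % n_bins], log_pb, log_pd).sum()
    for d in range(n_bins)
])

# Normalizing yields a posterior, i.e., a quantified confidence per offset.
post = np.exp(logL - logL.max())
post /= post.sum()
print(int(post.argmax()), float(post.max()))
```

Because the output is a normalized posterior rather than a bare correlation peak, a confidence threshold (e.g., 95 percent) can be applied directly, as in the abstract.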

10.
Science ; 371(6532): 889-890, 2021 02 26.
Article in English | MEDLINE | ID: mdl-33632835

Subjects
Lasers, Light
11.
Chaos ; 29(12): 123108, 2019 Dec.
Article in English | MEDLINE | ID: mdl-31893676

ABSTRACT

We explore the hyperparameter space of reservoir computers used for forecasting of the chaotic Lorenz '63 attractor with Bayesian optimization. We use a new measure of reservoir performance, designed to emphasize learning the global climate of the forecasted system rather than short-term prediction. We find that optimizing over this measure more quickly excludes reservoirs that fail to reproduce the climate. The results of optimization are surprising: the optimized parameters often specify a reservoir network with very low connectivity. Inspired by this observation, we explore reservoir designs with even simpler structure and find well-performing reservoirs that have zero spectral radius and no recurrence. These simple reservoirs provide counterexamples to widely used heuristics in the field and may be useful for hardware implementations of reservoir computers.
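
The surprising end point of that exploration, a well-performing reservoir with zero spectral radius and no recurrence, can be sketched as follows (an illustrative toy with assumed sizes, scales, and a one-step prediction task rather than the paper's full forecasting setup): the "reservoir" is just a static random tanh expansion of the current input, trained with a linear readout:

```python
import numpy as np

rng = np.random.default_rng(42)

# Lorenz '63 vector field and a fixed-step RK4 integrator.
def f(u):
    x, y, z = u
    return np.array([10.0 * (y - x), x * (28.0 - z) - y, x * y - (8.0 / 3.0) * z])

def rk4(u, dt=0.01):
    k1 = f(u); k2 = f(u + 0.5 * dt * k1)
    k3 = f(u + 0.5 * dt * k2); k4 = f(u + dt * k3)
    return u + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

u = np.array([1.0, 1.0, 1.0])
for _ in range(1000):                  # discard the transient
    u = rk4(u)
traj = np.empty((3000, 3))
for t in range(3000):
    traj[t] = u
    u = rk4(u)

# "Reservoir" with no recurrence and zero spectral radius: a static random
# nonlinear expansion of the current input (sizes/scales are assumptions).
N = 100
W_in = rng.normal(0.0, 0.05, (N, 3))
bias = rng.normal(0.0, 0.5, N)
R = np.tanh(traj[:-1] @ W_in.T + bias)       # reservoir states
X = np.c_[R, np.ones(len(R))]                # readout sees states + constant
Y = traj[1:]                                 # one-step-ahead targets
W_out = np.linalg.lstsq(X, Y, rcond=None)[0]

nrmse = np.sqrt(np.mean((X @ W_out - Y) ** 2)) / np.std(Y)
print(nrmse)  # small one-step error despite the memoryless reservoir
```

The point is the counterexample: with no recurrent connections at all, the network still supports accurate short-term prediction, contrary to the heuristic that a reservoir needs a spectral radius near one.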

12.
Opt Lett ; 43(16): 3806-3809, 2018 Aug 15.
Article in English | MEDLINE | ID: mdl-30106888

ABSTRACT

The interference of two photons at a beam splitter is at the core of many quantum photonic technologies, such as quantum key distribution or linear-optics quantum computing. Observing high-visibility interference is challenging because of the difficulty of realizing indistinguishable single-photon sources. Here, we perform a two-photon interference experiment using phase-randomized weak coherent states with different mean photon numbers. We place a tight upper bound on the expected coincidences for the case when the incident wavepackets contain single photons, allowing us to observe the Hong-Ou-Mandel effect. We find that the interference visibility is at least as large as 0.995 (+0.005/−0.013).
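
For context, the classical benchmark such an experiment must be compared against can be computed directly (a standard textbook model with threshold detectors, simplified relative to the paper's analysis): for two phase-randomized weak coherent states, the Hong-Ou-Mandel visibility approaches 0.5, whereas indistinguishable single photons allow visibility up to 1:

```python
import numpy as np

# Coincidence probability at a 50:50 beam splitter for two phase-randomized
# coherent states with mean photon numbers mu1, mu2, using threshold
# detectors (a textbook model; the paper's bound is derived more carefully).
def coincidence(mu1, mu2, interfering=True):
    phis = np.linspace(0.0, 2.0 * np.pi, 2001)[:-1]  # relative-phase average
    if interfering:
        cross = np.sqrt(mu1 * mu2) * np.cos(phis)
        n_c = 0.5 * (mu1 + mu2) + cross   # mean photon number, output c
        n_d = 0.5 * (mu1 + mu2) - cross   # mean photon number, output d
    else:
        n_c = n_d = np.full_like(phis, 0.5 * (mu1 + mu2))
    # Threshold detectors: click probability 1 - exp(-n) for Poissonian light.
    return float(np.mean((1.0 - np.exp(-n_c)) * (1.0 - np.exp(-n_d))))

mu = 1e-3
V = 1.0 - coincidence(mu, mu, True) / coincidence(mu, mu, False)
print(V)  # ~0.5: the classical limit that single-photon interference exceeds
```

A measured visibility near 0.995, as reported in the abstract, therefore requires isolating the single-photon contribution of the wavepackets rather than the raw coherent-state statistics computed here.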

13.
Rev Sci Instrum ; 89(6): 063117, 2018 Jun.
Article in English | MEDLINE | ID: mdl-29960551

ABSTRACT

The superconducting nanowire single-photon detector (SNSPD) is a leading technology for quantum information science applications using photons, and is finding increasing use in photon-starved classical imaging applications. Critical detector characteristics, such as timing resolution (jitter), reset time, and maximum count rate, are heavily influenced by the readout electronics that sense and amplify the photon detection signal. We describe a readout circuit for SNSPDs using commercial off-the-shelf amplifiers operating at cryogenic temperatures. Our design demonstrates a 35 ps timing resolution and a maximum count rate of over 2 × 10⁷ counts per second, while maintaining <3 mW power consumption per channel, making it suitable for a multichannel readout.

14.
Chaos ; 28(12): 123119, 2018 Dec.
Article in English | MEDLINE | ID: mdl-30599514

ABSTRACT

Reservoir computing is a neural network approach for processing time-dependent signals that has seen rapid development in recent years. Physical implementations of the technique using optical reservoirs have demonstrated remarkable accuracy and processing speed at benchmark tasks. However, these approaches require an electronic output layer to maintain high performance, which limits their use in tasks such as time-series prediction, where the output is fed back into the reservoir. We present here a reservoir computing scheme that has rapid processing speed both by the reservoir and the output layer. The reservoir is realized by an autonomous, time-delay, Boolean network configured on a field-programmable gate array. We investigate the dynamical properties of the network and observe the fading memory property that is critical for successful reservoir computing. We demonstrate the utility of the technique by training a reservoir to learn the short- and long-term behavior of a chaotic system. We find accuracy comparable to state-of-the-art software approaches of a similar network size, but with a superior real-time prediction rate up to 160 MHz.

15.
Sci Adv ; 3(11): e1701491, 2017 11.
Article in English | MEDLINE | ID: mdl-29202028

ABSTRACT

The security of conventional cryptography systems is threatened in the forthcoming era of quantum computers. Quantum key distribution (QKD) features fundamentally proven security and offers a promising option for quantum-proof cryptography. Although prototype QKD systems over optical fiber have been demonstrated over the years, the key generation rates remain several orders of magnitude lower than those of current classical communication systems. In an effort toward a commercially viable QKD system with improved key generation rates, we developed a discrete-variable QKD system based on time-bin quantum photonic states that can generate provably secure cryptographic keys at megabit-per-second rates over metropolitan distances. We use high-dimensional quantum states that transmit more than one secret bit per received photon, alleviating detector saturation effects in the superconducting nanowire single-photon detectors used in our system, which feature very high detection efficiency (more than 70%) and low timing jitter (less than 40 ps). Our system is constructed using commercial off-the-shelf components, and the adopted protocol can be readily extended to free-space quantum channels. The security analysis adopted to distill the keys ensures that the demonstrated protocol is robust against coherent attacks, finite-size effects, and a broad class of experimental imperfections identified in our system.

16.
Opt Express ; 25(18): 21861-21876, 2017 Sep 04.
Article in English | MEDLINE | ID: mdl-29041478

ABSTRACT

Commercial photon-counting modules based on actively quenched solid-state avalanche photodiode sensors are used in a wide variety of applications. Manufacturers characterize their detectors by specifying a small set of parameters, such as detection efficiency, dead time, dark count rate, afterpulsing probability, and single-photon arrival-time resolution (jitter). However, they usually do not specify the range of conditions over which these parameters are constant or present a sufficient description of the characterization process. In this work, we perform several novel tests on two commercial detectors and identify an additional set of imperfections that must be specified to sufficiently characterize their behavior. These include rate dependence of the dead time and jitter, detection delay shift, and "twilighting". We find that these additional non-ideal behaviors can lead to unexpected effects or a strong deterioration of the performance of a system using these devices. We explain their origin through an in-depth analysis of the active quenching process. To mitigate the effects of these imperfections, we design a custom-built detection system using a novel active quenching circuit. Its performance is compared against two commercial detectors in a fast quantum key distribution system with hyper-entangled photons and a random number generator.

17.
Phys Rev E ; 95(2-1): 022211, 2017 Feb.
Article in English | MEDLINE | ID: mdl-28297900

ABSTRACT

Biochemical systems with switch-like interactions, such as gene regulatory networks, are well modeled by autonomous Boolean networks. Specifically, the topology and logic of gene interactions can be described by systems of continuous piecewise-linear differential equations, enabling analytical predictions of the dynamics of specific networks. However, most models do not account for time delays along links associated with spatial transport, mRNA transcription, and translation. To address this issue, we have developed an experimental test bed to realize a time-delay autonomous Boolean network with three inhibitory nodes, known as a repressilator, and use it to study the dynamics that arise as time delays along the links vary. We observe various nearly periodic oscillatory transient patterns with extremely long lifetime, which emerge in small network motifs due to the delay, and which are distinct from the eventual asymptotically stable periodic attractors. For repeated experiments with a given network, we find that stochastic processes give rise to a broad distribution of transient times with an exponential tail. In some cases, the transients are so long that it is doubtful the attractors will ever be approached in a biological system that has a finite lifetime. To counteract the long transients, we show experimentally that small, occasional perturbations applied to the time delays can force the trajectories to rapidly approach the attractors.
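
The core object, a ring of three inhibitory nodes with link delays, can be sketched in a discrete-time toy (the delay values and initial conditions here are hypothetical; the experimental network is continuous-time electronic logic whose analog imperfections and stochasticity are precisely what produce the long transients):

```python
from collections import deque

# Three NOT nodes in a ring; node i reads the output of node i-1 delayed
# by tau_i time steps (delay values are arbitrary illustrative choices).
taus = [3, 5, 7]
lines = [deque([0] * tau, maxlen=tau) for tau in taus]  # delay lines, zeroed

trace = []
for _ in range(200):
    # Each node inverts the oldest (most delayed) value on its input line.
    outputs = [1 - lines[i][0] for i in range(3)]
    for i in range(3):
        lines[(i + 1) % 3].append(outputs[i])  # feed the next node's delay line
    trace.append(outputs[0])

# An odd ring of inverters cannot settle; once the initial conditions are
# flushed, the waveform repeats every 2*(tau_1 + tau_2 + tau_3) = 30 steps.
period = 2 * sum(taus)
print(trace[100:130] == trace[100 + period:130 + period])  # → True
```

This deterministic toy goes straight to the periodic attractor; the abstract's point is that adding stochastic variation to the delays yields near-periodic transients whose lifetimes are broadly distributed and can vastly exceed any biologically relevant time.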

18.
Chaos ; 26(9): 094810, 2016 09.
Article in English | MEDLINE | ID: mdl-27781448

ABSTRACT

Autonomous Boolean networks are commonly used to model the dynamics of gene regulatory networks and allow for the prediction of stable dynamical attractors. However, most models do not account for time delays along the network links or for noise, which are crucial features of real biological systems. Concentrating on two paradigmatic motifs, the toggle switch and the repressilator, we develop an experimental testbed that explicitly includes both inter-node time delays and noise, using digital logic elements on field-programmable gate arrays. We observe transients that last millions to billions of characteristic time scales and scale exponentially with the inter-node time delays, a phenomenon known as super-transient scaling. We develop a hybrid model that includes time delays along network links and allows for stochastic variation in the delays. Using this model, we explain the observed super-transient scaling of both motifs and recreate the experimentally measured transient distributions.

19.
Chaos ; 25(8): 083113, 2015 Aug.
Article in English | MEDLINE | ID: mdl-26328564

ABSTRACT

We present the design of an autonomous time-delay Boolean network realized with readily available electronic components. Through simulations and experiments that account for the detailed nonlinear response of each circuit element, we demonstrate that a network with five Boolean nodes displays complex behavior. Furthermore, we show that the dynamics of two identical networks display near-instantaneous synchronization to a periodic state when forced by a common periodic Boolean signal. A theoretical analysis of the network reveals the conditions under which complex behavior is expected in an individual network and the occurrence of synchronization in the forced networks. This research will enable future experiments on autonomous time-delay networks using readily available electronic components with dynamics on a slow enough time-scale so that inexpensive data collection systems can faithfully record the dynamics.


Subjects
Algorithms, Theoretical Models, Time Factors
20.
Article in English | MEDLINE | ID: mdl-25768448

ABSTRACT

We demonstrate reservoir computing with a physical system using a single autonomous Boolean logic element with time-delay feedback. The system generates a chaotic transient with a window of consistency lasting between 30 and 300 ns, which we show is sufficient for reservoir computing. We then characterize the dependence of computational performance on system parameters to find the best operating point of the reservoir. When the best parameters are chosen, the reservoir is able to classify short input patterns with performance that decreases over time. In particular, we show that four distinct input patterns can be classified for 70 ns, even though the inputs are only provided to the reservoir for 7.5 ns.


Subjects
Computers, Linear Models, Nonlinear Dynamics, Automated Pattern Recognition, Time Factors